Generalization Error and the Expected Network Complexity

Author

  • Chuanyi Ji
Abstract

For two-layer networks with n sigmoidal hidden units, the generalization error is shown to be bounded by O(EK/N) + O(((EK) d / N) log N), where d and N are the input dimension and the number of training samples, respectively. E represents the expectation over the random number K of hidden units (1 ≤ K ≤ n). The probability Pr(K = k) (1 ≤ k ≤ n) is determined by a prior distribution on the weights, which corresponds to a Gibbs distribution of a regularizer. This relationship makes it possible to characterize explicitly how a regularization term affects the bias/variance of networks. The bound can be obtained analytically for a large class of commonly used priors. It can also be applied to estimate the expected network complexity EK in practice. The result provides a quantitative explanation of how large networks can generalize well.
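The bound above can be evaluated numerically once the prior Pr(K = k) is specified. The sketch below assumes an exponential (Gibbs-style) prior Pr(K = k) ∝ exp(−λk) over 1 ≤ k ≤ n and unit constants inside the O(·) terms; the function names, λ values, and constants are illustrative assumptions, not values from the paper.

```python
import math

def gibbs_prior(n, lam):
    """Pr(K = k) proportional to exp(-lam * k), for k = 1..n (assumed form)."""
    weights = [math.exp(-lam * k) for k in range(1, n + 1)]
    z = sum(weights)
    return [w / z for w in weights]

def expected_complexity(n, lam):
    """Expected number of hidden units EK under the assumed Gibbs prior."""
    probs = gibbs_prior(n, lam)
    return sum(k * p for k, p in zip(range(1, n + 1), probs))

def bound(ek, d, N):
    """Order-of-magnitude bound EK/N + (EK * d / N) * log N, with unit constants."""
    return ek / N + (ek * d / N) * math.log(N)

# Stronger regularization (larger lam) concentrates the prior on small K,
# shrinking EK and hence the bound.
n, d, N = 100, 10, 10_000
for lam in (0.01, 0.1, 1.0):
    ek = expected_complexity(n, lam)
    print(f"lam={lam}: EK={ek:.2f}, bound ~ {bound(ek, d, N):.4f}")
```

This illustrates the abstract's point that the regularizer, through the prior it induces, controls the expected complexity EK and thereby the bias/variance trade-off.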


Similar Resources

Modeling of measurement error in refractive index determination of fuel cell using neural network and genetic algorithm

Abstract: In this paper, a method for determining the refractive index of a fuel-cell membrane, based on a three-longitudinal-mode laser heterodyne interferometer, is presented. The optical path difference between the target and reference paths is fixed, and the phase shift is then calculated in terms of the refractive index shift. The measurement accuracy of this system is limited by nonlinearity erro...


Flat Minima

We present a new algorithm for finding low-complexity neural networks with high generalization capability. The algorithm searches for a "flat" minimum of the error function. A flat minimum is a large connected region in weight space where the error remains approximately constant. An MDL-based, Bayesian argument suggests that flat minima correspond to "simple" networks and low expected overfitti...
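The notion of flatness described above, a connected region of weight space where the error stays approximately constant, can be probed directly: perturb the weights within a fixed radius and record the worst-case increase in loss. This is a minimal sketch on a toy quadratic loss, not the paper's MDL-based algorithm; every name and constant here is an illustrative assumption.

```python
import math
import random

random.seed(0)

def loss(w, curvature):
    # Toy quadratic loss with its minimum at w = 0; larger curvature
    # means a sharper minimum.
    return 0.5 * curvature * sum(x * x for x in w)

def flatness(w_star, curvature, radius=0.1, trials=1000):
    """Worst loss increase seen under random perturbations of a fixed radius."""
    base = loss(w_star, curvature)
    worst = 0.0
    for _ in range(trials):
        delta = [random.gauss(0.0, 1.0) for _ in w_star]
        norm = math.sqrt(sum(d * d for d in delta))
        delta = [d * radius / norm for d in delta]  # scale to exact radius
        perturbed = [x + d for x, d in zip(w_star, delta)]
        worst = max(worst, loss(perturbed, curvature) - base)
    return worst

w_star = [0.0] * 5
print("sharp minimum loss increase:", flatness(w_star, curvature=100.0))
print("flat minimum loss increase: ", flatness(w_star, curvature=0.01))
```

A small worst-case increase over the perturbation ball is exactly what "flat" means in the abstract; the MDL argument it cites connects such flat regions to shorter weight descriptions and hence lower expected overfitting.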


On the training error and generalization error of neural network regression without identifiability

In this article, we analyzed the expected training error and the expected generalization error for neural networks in the unidentifiable case, in which a set of output data is assumed to be a Gaussian noise sequence. First, the results on the bounds of the expected training error and the expected generalization error for general neural networks are reviewed. Second, we gave the order of the e...


Regularization Learning of Neural Networks for Generalization

In this paper, we propose a learning method of neural networks based on the regularization method and analyze its generalization capability. In learning from examples, training samples are independently drawn from some unknown probability distribution. The goal of learning is minimizing the expected risk for future test samples, which are also drawn from the same distribution. The problem can b...


Chaitin-Kolmogorov Complexity and Generalization in Neural Networks

We present a unified framework for a number of different ways of failing to generalize properly. During learning, sources of random information contaminate the network, effectively augmenting the training data with random information. The complexity of the function computed is therefore increased, and generalization is degraded. We analyze replicated networks, in which a number of identical net...


Empirical Estimation of Concept Compressibility and Generalization Error

Abstract. In this paper we focus on the relationship between concept complexity and generalization error of learned concept descriptions. After showing a simple way to estimate concept complexity from learning data, we empirically link this estimation to the generalization error of different learning algorithms. While it is expected that the error increases with the complexity of the concept, q...





Publication date: 1993